Extreme 3D Face Reconstruction: Seeing Through Occlusions
Existing single view, 3D face reconstruction methods can produce beautifully
detailed 3D results, but typically only for near frontal, unobstructed
viewpoints. We describe a system designed to provide detailed 3D
reconstructions of faces viewed under extreme conditions, including out-of-plane
rotations and occlusions. Motivated by the concept of bump mapping, we propose
a layered approach which decouples estimation of a global shape from its
mid-level details (e.g., wrinkles). We estimate a coarse 3D face shape which
acts as a foundation and then separately layer this foundation with details
represented by a bump map. We show how a deep convolutional encoder-decoder can
be used to estimate such bump maps. We further show how this approach naturally
extends to generate plausible details for occluded facial regions. We test our
approach and its components extensively, quantitatively demonstrating the
invariance of our estimated facial details. We further provide numerous
qualitative examples showing that our method produces detailed 3D face shapes
in viewing conditions where existing state-of-the-art methods often break down.

Comment: Accepted to CVPR'18. Previously titled: "Extreme 3D Face Reconstruction: Looking Past Occlusions."
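As an illustration of the layered decomposition described above (this is not the paper's encoder-decoder network; the function name and representation are hypothetical), the final surface can be thought of as a coarse depth map plus a scaled bump-map perturbation:

```python
def layer_bump(coarse_depth, bump_map, scale=1.0):
    """Layer mid-level detail (a bump map) on top of a coarse depth map.

    Both inputs are H x W grids of floats; bump values perturb the coarse
    surface along the viewing direction. Illustrative only: in the paper,
    the bump map itself is predicted by a deep encoder-decoder.
    """
    return [[c + scale * b for c, b in zip(crow, brow)]
            for crow, brow in zip(coarse_depth, bump_map)]

# A 1 x 2 depth map with per-pixel detail offsets:
detailed = layer_bump([[1.0, 2.0]], [[0.5, -0.5]])  # -> [[1.5, 1.5]]
```

Because the coarse shape and the details are decoupled, occluded regions can be filled by predicting plausible bump values there without disturbing the global shape.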
Effective Face Frontalization in Unconstrained Images
"Frontalization" is the process of synthesizing frontal facing views of faces
appearing in single unconstrained photos. Recent reports have suggested that
this process may substantially boost the performance of face recognition
systems by transforming the challenging problem of recognizing faces
viewed from unconstrained viewpoints into the easier problem of recognizing
faces in constrained, forward-facing poses. Previous frontalization methods did this
by attempting to approximate 3D facial shapes for each query image. We observe
that 3D face shape estimation from unconstrained photos may be a harder problem
than frontalization and can potentially introduce facial misalignments.
Instead, we explore the simpler approach of using a single, unmodified, 3D
surface as an approximation to the shape of all input faces. We show that this
leads to a straightforward, efficient and easy to implement method for
frontalization. More importantly, it produces aesthetic new frontal views and
is surprisingly effective when used for face recognition and gender estimation.
Matrix columns allocation problems
The Orthogonal Frequency Division Multiple Access (OFDMA) transmission technique is gaining popularity as a preferred technique in the emerging broadband wireless access standards. Motivated by the OFDMA transmission technique, we define the following problem: let M be a matrix (over R) of size a×b. Given a vector of non-negative integers C = ⟨c_1, c_2, …, c_b⟩ such that Σ_j c_j = a, we would like to allocate a cells in M such that (i) each row of M contains a single allocation, and (ii) for each element c_i ∈ C there is a unique column of M which contains exactly c_i allocations. Our goal is to find an allocation of minimal value, that is, one minimizing the sum of the a allocated cells of M. The nature of the suggested new problem is investigated in this paper. Efficient algorithms are suggested for some interesting cases; for other cases, NP-hardness proofs are given, followed by inapproximability results.
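A brute-force sketch of the problem statement (the function name is hypothetical, and this exponential search is only a baseline; the paper's efficient algorithms for the tractable cases are not reproduced here): for each bijection between the counts in C and the columns of M, enumerate the row-to-column assignments realizing those counts and keep the cheapest.

```python
from itertools import permutations

def min_allocation(M, C):
    """Exhaustive solver for the matrix-columns allocation problem.

    M: an a x b matrix (list of rows); C: counts c_1..c_b with sum(C) == a.
    Each row receives exactly one allocation, and the per-column allocation
    counts must be some permutation of C.  Returns the minimal total value.
    """
    a = len(M)
    assert sum(C) == a
    best = None
    # (ii): try every bijection assigning the counts in C to columns.
    for counts in set(permutations(C)):
        # counts[j] rows must be allocated in column j; list the column
        # index once per required allocation and permute over the rows.
        slots = [j for j, c in enumerate(counts) for _ in range(c)]
        for assign in set(permutations(slots)):
            # (i): row i is allocated in column assign[i].
            total = sum(M[i][assign[i]] for i in range(a))
            if best is None or total < best:
                best = total
    return best

# 3 x 2 example with C = <2, 1>: two allocations in one column, one in the other.
cost = min_allocation([[1, 2], [3, 1], [2, 5]], [2, 1])  # -> 4 (cells 1, 1, 2)
```

For fixed counts per column, the inner minimization is a transportation problem, so it could be solved in polynomial time by min-cost matching; the hardness in the paper concerns the combined choice of the count-to-column bijection.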
Mutational heterogeneity in cancer and the search for new cancer genes
Major international projects are now underway aimed at creating a comprehensive catalog of all genes responsible for the initiation and progression of cancer. These studies involve sequencing of matched tumor-normal samples followed by mathematical analysis to identify those genes in which mutations occur more frequently than expected by random chance. Here, we describe a fundamental problem with cancer genome studies: as the sample size increases, the list of putatively significant genes produced by current analytical methods burgeons into the hundreds. The list includes many implausible genes (such as those encoding olfactory receptors and the muscle protein titin), suggesting extensive false positive findings that overshadow true driver events. Here, we show that this problem stems largely from mutational heterogeneity and provide a novel analytical methodology, MutSigCV, for resolving the problem. We apply MutSigCV to exome sequences from 3,083 tumor-normal pairs and discover extraordinary variation in (i) mutation frequency and spectrum within cancer types, which shed light on mutational processes and disease etiology, and (ii) mutation frequency across the genome, which is strongly correlated with DNA replication timing and also with transcriptional activity. By incorporating mutational heterogeneity into the analyses, MutSigCV is able to eliminate most of the apparent artefactual findings and allow true cancer genes to rise to attention.
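As a toy illustration of why per-gene background rates matter (this is a drastic simplification, not the actual MutSigCV model, and all names are hypothetical), one can score a gene by the binomial tail probability of its mutation count under that gene's own background mutation rate:

```python
from math import comb

def binom_sf(k, n, p):
    """Upper-tail probability P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

def gene_pvalue(n_mutations, covered_bases, bg_rate):
    """Chance of seeing at least n_mutations, given a gene-specific
    per-base background mutation rate.

    Letting bg_rate vary per gene (e.g. higher for late-replicating or
    lowly transcribed loci) is the heterogeneity correction in miniature:
    the same raw mutation count is less significant in a gene whose
    background rate is high.
    """
    return binom_sf(n_mutations, covered_bases, bg_rate)

# Identical counts, different backgrounds: the high-background gene is
# far less significant.
low_bg = gene_pvalue(5, 1000, 0.001)
high_bg = gene_pvalue(5, 1000, 0.002)
```

Under a single genome-wide rate, large genes in high-background regions (such as the olfactory receptors and titin mentioned above) look spuriously significant; raising their expected rate discounts their counts accordingly.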
An international effort towards developing standards for best practices in analysis, interpretation and reporting of clinical genome sequencing results in the CLARITY Challenge
There is tremendous potential for genome sequencing to improve clinical diagnosis and care once it becomes routinely accessible, but this will require formalizing research methods into clinical best practices in the areas of sequence data generation, analysis, interpretation and reporting. The CLARITY Challenge was designed to spur convergence in methods for diagnosing genetic disease starting from clinical case history and genome sequencing data. DNA samples were obtained from three families with heritable genetic disorders and genomic sequence data were donated by sequencing platform vendors. The challenge was to analyze and interpret these data with the goals of identifying disease-causing variants and reporting the findings in a clinically useful format. Participating contestant groups were solicited broadly, and an independent panel of judges evaluated their performance.
RESULTS:
A total of 30 international groups were engaged. The entries reveal a general convergence of practices on most elements of the analysis and interpretation process. However, even given this commonality of approach, only two groups identified the consensus candidate variants in all disease cases, demonstrating a need for consistent fine-tuning of the generally accepted methods. There was greater diversity of the final clinical report content and in the patient consenting process, demonstrating that these areas require additional exploration and standardization.
CONCLUSIONS:
The CLARITY Challenge provides a comprehensive assessment of current practices for using genome sequencing to diagnose and report genetic diseases. There is remarkable convergence in bioinformatic techniques, but medical interpretation and reporting are areas that require further development by many groups.